Web Survey Bibliography
Abstract #1:
The growing use of smartphone applications (“apps”), particularly among young adults, has opened a new frontier for data collection. This emerging form of Computer-Assisted Self-Interviewing (CASI) offers new techniques for engaging respondents on the mobile platform in response to the persistent challenge of respondent cooperation. In recent years, game mechanics have been integrated into smartphone apps to draw on users’ intrinsic motivation to engage in a task. Game-mechanic tools such as points, badges, levels, challenges, and leaderboards are used to motivate desired behaviors (i.e., “gamifying” the process without necessarily turning the task into a full “game”). Moreover, “social sharing” on networks such as Facebook is a defining attribute of today’s youth and a critical feature of some of the most successful apps. Social-sharing mechanics such as commenting, posting updates, or “liking” the status of others connect users within the app community and across social networks such as Facebook. Leveraging both game and social mechanics in mobile app research can maximize respondent engagement for longitudinal data collection. To measure these emerging techniques, Nielsen will conduct a split-sample experiment contrasting two versions of an iPhone app that collects media usage information. One version of the app will be fully integrated with game and social mechanics from the start, while the other will launch without these features and then add them in phases. The experiment is expected to show how effective these emerging techniques are at engaging respondents and whether data collection via smartphone app is a viable method for repeated measures of hard-to-reach younger cohorts.
Abstract #2:
This research explores the utility of Facebook applications as platforms for survey and passive data collection. Facebook applications are interactive, user-facing tools that enhance the user experience through social games, quizzes, and other social features. Facebook’s Graph Application Programming Interface (API) gives researchers the tools to develop applications that can access both public and private data (provided access is permitted), which in turn offers data collection capabilities survey researchers are only beginning to understand. Traditional survey questionnaires constrain not only the type of data that can be collected but also the volume, accuracy, and timeliness of the data and the data collection process. Facebook’s Graph API is changing the way we conceptualize the word “data,” with major implications across a variety of dimensions directly related to survey research. For instance, Facebook offers the capability to stream data in real time, drawing from a user base of over 800 million, in forms that are both new (e.g., location check-ins, social networks, and status updates) and old (e.g., demographic data). Facebook applications give researchers an opportunity to develop unique approaches to research questions, and they provide a platform for questionnaire administration, data creation, and passive data collection in real time. This paper explores the applicability of Facebook applications, such as social games and user-experience-enhancing applications, to survey research. Results from a pilot study that used a Facebook application to engage the social networks of military personnel in building registries illustrate the potential uses of such applications.
Specifically, this research aims to provide a better understanding of the new types of data being created, Facebook applications as a mode of questionnaire administration and participant recruitment, implications for sample development, and limitations to be addressed going forward.
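As a sketch of the kind of Graph API read the abstract describes, the snippet below builds a request URL for a profile node. The node name, field list, and token here are placeholder assumptions, not details from the paper, and the fields an application can actually retrieve depend on the API version and the permissions a user has granted:

```python
from urllib.parse import urlencode

def graph_request_url(node, fields, access_token):
    """Build a Facebook Graph API read URL for a node such as 'me'.

    fields       -- profile fields to request (availability depends on
                    the permissions the user granted the application)
    access_token -- user-granted OAuth token (placeholder here)
    """
    base = "https://graph.facebook.com/" + node
    query = urlencode({"fields": ",".join(fields),
                       "access_token": access_token})
    return base + "?" + query

# A hypothetical read of a consenting user's basic profile fields.
url = graph_request_url("me", ["id", "name", "location"], "<token>")
```

The point of the sketch is the consent boundary the abstract emphasizes: the request carries a user-granted token, and only data covered by that grant is returned.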
Abstract #3:
The Randomized Response Technique (RRT) is used to encourage accurate responding to sensitive survey questions. Under the RRT, respondents are given two questions (one sensitive, the other nonsensitive with a known response distribution) and are instructed to answer one of them. Which question they answer is determined by the outcome of a random act with a known probability (e.g., a coin toss) that only the respondent sees. Researchers do not know which question each respondent answered but can still estimate the proportion of each response to the sensitive question. Though it is designed to reduce error, the RRT may actually increase measurement error if respondents implement it incorrectly. Evaluating the RRT is challenging because the outcome of its driving feature, the randomizer, is concealed from researchers; as a result, prior research has typically assumed that higher reporting of undesirable responses signals the RRT’s success. Eight RRT items were evaluated in a nonprobability survey of 75 participants in the online virtual world Second Life (SL). Participants were randomly assigned to one of three modes: face-to-face interview in SL, voice chat interview in SL, or web. The randomizer in all modes was an interactive, three-dimensional virtual coin toss that was discreetly manipulated by the researchers so they could determine with near certainty whether participants followed the procedure. Only 67% of participants followed the procedure for every RRT item, and the highest rate of procedural noncompliance on a single item was 13%. In a true application of the RRT, such noncompliance would greatly inflate estimates. There were no significant differences in RRT compliance by demographic characteristics or survey mode. Most participants indicated in debriefing questions that they enjoyed this method of answering questions, but their noncompliance is cause for additional skepticism about using the RRT.
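Under the two-question design described above, the prevalence of the sensitive attribute can be recovered from the observed overall “yes” rate, since observed = P·π_sensitive + (1 − P)·π_nonsensitive, where P is the probability the randomizer directs a respondent to the sensitive question. A minimal sketch of this standard estimator (the function name and example values are ours, not from the abstract):

```python
def rrt_estimate(yes_rate, p_sensitive_q, nonsensitive_yes_rate):
    """Estimate the true 'yes' proportion for the sensitive question.

    yes_rate              -- observed overall proportion of 'yes' responses
    p_sensitive_q         -- probability the randomizer selects the
                             sensitive question (e.g. 0.5 for a fair coin)
    nonsensitive_yes_rate -- known 'yes' proportion for the nonsensitive
                             question
    """
    # Invert: yes_rate = P * pi_s + (1 - P) * pi_n  =>  solve for pi_s
    return (yes_rate - (1 - p_sensitive_q) * nonsensitive_yes_rate) / p_sensitive_q

# Example: 40% overall 'yes', fair coin, nonsensitive question with a
# known 30% 'yes' rate -> estimated sensitive prevalence of 0.5.
estimate = rrt_estimate(0.40, 0.5, 0.30)  # -> 0.5
```

The inversion also shows why the noncompliance the study found is damaging: respondents who answer the sensitive question when the coin directed them elsewhere (or vice versa) shift the observed rate, and the division by P amplifies that shift in the estimate.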
Web survey bibliography - Marketing/business (336)
- Achieving Strong Privacy in Online Survey; 2017; Zhou, Yo.; Zhou, Yi.; Chen, S.; Wu, S. S.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Is There a Future for Surveys; 2017; Miller, P. V.
- Mobile Research im Kontext der digitalen Transformation; 2017; Friedrich-Freksa, M.
- Virtual reality meets sensory research; 2017; Depoortere, L.
- Online customer journey analysis: a data science toolbox; 2017; Bonnay, D.
- Comparing Twitter and Online Panels for Survey Recruitment of E-Cigarette Users and Smokers; 2016; Guillory, J.; Kim, A.; Murphy, J.; Bradfield, B.; Nonnemaker, J.; Hsieh, Y. P.
- Statistical Design for Online Experiments Across Desktops, Tablets, Smartphones (and Maybe Wearable...; 2016; Qian, P.; Sadeghi, S.; Arora, N. K.
- FocusVision 2015 Annual MR Technology Report; 2016; Macer, T.; Wilson, S.
- The Effects of a Delayed Incentive on Response Rates, Response Mode, Data Quality, and Sample Bias in...; 2016; McGonagle, K.; Freedman, V. A.
- A look at the unique data-gathering process behind the Harvard Impact Study; 2016; Vitale, J.
- Are sliders too slick for surveys?; 2016; Buskirk, T. D.
- Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk; 2016; Berinsky, A.; Huber, G. A.; Lenz, G. S.
- Web-based versus Paper-based Survey Data: An Estimation of Road Users’ Value of Travel Time Savings...; 2016; Kato, H.; Sakashita, A.; Tsuchiya, Tak.
- An Examination of Opposing Responses on Duplicated Multi-Mode Survey Responses; 2016; Djangali, A.
- Scientific Surveys Based on Incomplete Sampling Frames and High Rates of Nonresponse; 2016; Fahimi, M.; Barlas, F. M.; Thomas, R. K.; Buttermore, N. R.
- Adapting Labour Force Survey questions from interviewer-administered modes for web self-completion in...; 2015; Betts, P.; Cubbon, B.
- Internet Panels, Professional Respondents, and Data Quality; 2015; Matthijsse, S.; De Leeuw, E. D.; Hox, J.
- Are they willing to use the web? First results of a possible switch from PAPI to CAPI/CAWI in an establishment...; 2015; Ellguth, P.; Kohaut, S.
- GreenBook Research Industry Trends Report; 2015; Murphy, L. (Ed.)
- The role of gamification in better accessing reality and hence increasing data validity; 2015; Bailey, P.; Kernohan, H.; Pritchard, G.
- Rewarding the Truth; 2015; Puleston, J.
- Impact of raising awareness of respondents on the measurement quality in a web survey; 2015; Revilla, M.
- Email subject lines and response rates to invitations to participate in a web survey and a face-to-face...; 2015; Sappleton, N.; Lourenco, F.
- Can a non-probabilistic online panel achieve question quality similar to that of the European Social...; 2015; Revilla, M.; Saris, W. E.; Loewe, G.; Ochoa, C.
- Mode Effects in Mixed-Mode Economic Surveys: Insights from a Randomized Experiment; 2015; Hsu, J. W.; McFall, B. H.
- Web-based survey, calibration, and economic impact assessment of spending in nature based recreation; 2015; Paudel, K. P.; Devkota, N.; Gyawali, B.
- The Influence of Answer Box Format on Response Behavior on List-Style Open-Ended Questions; 2014; Keusch, F.
- Improving Survey Response Rates in Online Panels: Effects of Low-Cost Incentives and Cost-Free Text Appeal...; 2014; Pedersen, M. J.; Nielsen, C. V.
- Matrix versus paging designs in a brand attribution task; 2014; Conrad, F. G.; McCullough, W.; Nishimura, R.
- Internet-Based Surveys: Methodological Issues; 2014; Albaum, G.; Brockett, P.; Golden, L.; Han, V.; Roster, C. A.; Smith, S. M.; Wiley, J. B.
- Use of a Google Map Tool Embedded in an Internet Survey Instrument: Is it a Valid and Reliable Alternative...; 2014; Dasgupta, S.; Vaughan, A. S.; Kramer, M. R.; Sanchez, T. H.; Sullivan, P. S.
- Sequential or Simultaneous Multi-Mode? Results from Two Large Surveys of Electric Utility Consumers; 2014; Jackson, C.; Ledoux, C.
- Targeting the bias – the impact of mass media attention on sample composition and representativeness...; 2014; Steinmetz, S.; Oez, F.; Tijdens, K. G.
- Exploring selection biases for developing countries - is the web a promising tool for data collection...; 2014; Tijdens, K. G.; Steinmetz, S.
- Measuring the very long, fuzzy tail in the occupational distribution in web-surveys; 2014; Tijdens, K. G.
- Moving answers with the GyroScale: Using the mobile device’s gyroscope for market research purposes...; 2014; Luetters, H.; Kraus, M.; Westphal, D.
- Clicking vs. Dragging: Different Uses of the Mouse and Their Implications for Online Surveys; 2014; Sikkel, D.; Steenbergen, R.; Gras, S.
- Innovation for television research - online surveys via HbbTV. A new technology with fantastic opportunities...; 2014; Herche, J.; Adler, M.
- Online mobile surveys in Italy: coverage and other methodological challenges; 2014; Poggio, T.
- How Sliders Bias Survey Data; 2013; Sellers, R.
- Survey Research Response Rates: Internet Technology vs. Snail Mail; 2013; Lanier, P. A.; Tanner, J. R.; Totaro, M. W.; Gradnigo, G.
- The impact of New Zealand's 2008 prohibition of piperazine-based party pills on young people'...; 2013; Sheridan, J.; Dong, C. Y.; Butler, R.; Barnes, J.
- How well do volunteer web panel surveys measure sensitive behaviours in the general population, and...; 2013; Erens, B.; Burkill, S.; Copas, A.; Couper, M. P.; Conrad, F.
- Effects of Gamification on Participation and Data Quality in a Real-World Market Research Domain; 2013; Cechanowicz, J.; Gutwin, C.; Brownell, B.; Goodfellow, L.
- Ideal participants in online market research: Lessons from closed communities; 2013; Heinze, A.; Ferneley, E.; Child, P.
- Online, face-to-face and telephone surveys—Comparing different sampling methods in wine consumer...; 2013; Szolnoki, G.; Hoffmann, D.
- Where does the Fair Trade price premium go? Confronting consumers' request with reality; 2013; Langen, N.; Adenaeuer, L.
- Customer satisfaction in Web 2.0 and information technology development; 2013; Sharma, G.; Baoku, L.
- Research staff and public engagement: a UK study; 2013; Davies, S.